26 research outputs found

    On the Security of Lattice-Based Signature Schemes in a Post-Quantum World

    Digital signatures are indispensable for security on the Internet because they guarantee authenticity, integrity, and non-repudiation of, for example, e-mails and software updates, and they are used in the Transport Layer Security (TLS) protocol for secure data transfer. Most signature schemes currently in use, such as the RSA signature scheme, are considered secure as long as the integer factorization problem or the discrete logarithm (DL) problem is computationally hard. At present, no algorithms are known that solve these problems on conventional computers in polynomial time. However, in 1997, Shor published a polynomial-time algorithm that uses quantum computation to solve the integer factorization and the DL problem. In particular, this means that RSA signatures are considered broken as soon as large-scale quantum computers exist. Due to significant advances in the area of quantum computing, it is reasonable to assume that quantum computers able to break the RSA scheme could exist within 20 years. In order to maintain authenticity, integrity, and non-repudiation of data, cryptographic schemes that cannot be broken by quantum attacks are required. In addition, these so-called post-quantum secure schemes should be efficient enough to be suitable for all established applications. Furthermore, solutions enabling a timely and secure transition from classical to post-quantum schemes are needed. This thesis contributes to that transition. We present the two lattice-based digital signature schemes TESLA and qTESLA; lattice-based cryptography is one of five approaches to constructing post-quantum secure schemes. Furthermore, we prove that our signature schemes are secure as long as the so-called Learning With Errors (LWE) problem is computationally hard to solve. It is presumed that even quantum computers cannot solve the LWE problem in polynomial time.
The security of our schemes is proven using security reductions. Since our reductions are tight and explicit, efficient instantiations are possible that provably guarantee a selected security level, as long as the corresponding LWE instance provides a certain hardness level. Since both our reductions (proven in the quantum random oracle model) and our instantiations take quantum attackers into account, TESLA and qTESLA are considered post-quantum secure. At the same time, the run-times for generating and verifying signatures with qTESLA are similar to (or faster than) those of the RSA scheme. However, the key and signature sizes of RSA are smaller than those of qTESLA. In order to protect both the theoretical signature schemes and their implementations against attacks, we analyze possible vulnerabilities to implementation attacks. In particular, we focus on cache side-channel attacks, which result from observing cache behavior, and fault attacks, which recover secret information by actively disrupting the execution of an algorithm. We present effective countermeasures for each implementation attack we found. Our analyses and countermeasures also influence the design and implementation of qTESLA. Although our schemes are considered (post-quantum) secure according to state-of-the-art LWE attacks, cryptanalysis of lattice-based schemes is still a relatively new field of research compared to that of RSA. Hence, there is a lack of confidence in the concrete instantiations and their promised security levels. However, due to developments in the field of quantum computing, a transition to post-quantum secure solutions seems more urgently required than ever. To resolve this dilemma, we present an approach to combine two schemes, e.g., qTESLA and the RSA signature scheme, such that the combination is secure as long as at least one of the two combined schemes is secure.
We present several such combiners to construct hybrid signature schemes and hybrid key encapsulation mechanisms, ensuring both authenticity and confidentiality in our Public-Key Infrastructure (PKI). Lastly, we also demonstrate how to apply the resulting hybrid schemes in standards such as X.509 or TLS. To summarize, this work presents post-quantum secure candidates which, using our hybrid schemes, can add post-quantum security to the current classical security in our PKI.
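The combiner idea described above can be sketched abstractly: sign the message with both component schemes, and accept a hybrid signature only if both component verifications succeed, so forging the hybrid requires breaking both schemes. The sketch below is illustrative only and is not the thesis's actual construction; the HMAC-based "schemes" are hypothetical stand-ins for, e.g., qTESLA and RSA.

```python
# Minimal sketch of a parallel "concatenation" combiner: the hybrid
# signature is the pair (sig1, sig2), and verification requires BOTH
# component schemes to accept. The HMAC-based stand-in schemes below are
# placeholders for real signature schemes such as qTESLA and RSA.
import hmac
import hashlib

def make_scheme(key: bytes):
    # Stand-in "signature scheme": a (sign, verify) pair of closures.
    def sign(msg: bytes) -> bytes:
        return hmac.new(key, msg, hashlib.sha256).digest()
    def verify(msg: bytes, sig: bytes) -> bool:
        return hmac.compare_digest(sign(msg), sig)
    return sign, verify

def hybrid(scheme1, scheme2):
    sign1, verify1 = scheme1
    sign2, verify2 = scheme2
    def sign(msg: bytes):
        return (sign1(msg), sign2(msg))      # sign with both schemes
    def verify(msg: bytes, sig) -> bool:
        s1, s2 = sig
        return verify1(msg, s1) and verify2(msg, s2)  # both must accept
    return sign, verify

sign, verify = hybrid(make_scheme(b"post-quantum"), make_scheme(b"classical"))
sig = sign(b"hello")
assert verify(b"hello", sig)
assert not verify(b"tampered", sig)
```

The cost of this simple combiner is that both signatures must be transmitted and verified; the thesis's combiners address further goals beyond this naive pairing.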

    A Note on Hybrid Signature Schemes

    This draft presents work in progress concerning hybrid/composite signature schemes. More concretely, we give several tailored combinations of Fiat-Shamir based signature schemes (such as Dilithium) or Falcon with RSA or DSA. We observe that there are a number of signature hybridization goals, few of which are achieved through parallel signing or concatenation approaches. These include proof composability (that the post-quantum hybrid signature security can easily be linked to the component algorithms), weak separability, strong separability, backwards compatibility, hybrid generality (i.e., hybrid compositions that can be instantiated with different algorithms once proven to be secure), and simultaneous verification. We do not consider backwards compatibility in this work, but our constructions aim to show the feasibility of achieving all other properties. As this is work in progress, the constructions are presented without the accompanying formal security analysis, which is to be included in an update.

    Decryption failure is more likely after success

    The user of an imperfectly correct lattice-based public-key encryption scheme leaks information about their secret key with each decryption query that they answer---even if they answer all queries successfully. Through a refinement of the D'Anvers--Guo--Johansson--Nilsson--Vercauteren--Verbauwhede failure boosting attack, we show that an adversary can use this information to improve his odds of finding a decryption failure. We also propose a new definition of δ-correctness, and we re-assess the correctness of several submissions to NIST's post-quantum standardization effort.

    FIDO2, CTAP 2.1, and WebAuthn 2: Provable Security and Post-Quantum Instantiation

    The FIDO2 protocol is a globally used standard for passwordless authentication, building on an alliance between major players in the online authentication space. While already widely deployed, the standard is still under active development. Since version 2.1 of its CTAP sub-protocol, FIDO2 can potentially be instantiated with post-quantum secure primitives. We provide the first formal security analysis of FIDO2 with the CTAP 2.1 and WebAuthn 2 sub-protocols. Our security models build on work by Barbosa et al. for their analysis of FIDO2 with CTAP 2.0 and WebAuthn 1, which we extend in several ways. First, we provide a more fine-grained security model that allows us to prove more relevant protocol properties, such as guarantees about token binding agreement, the None attestation mode, and user verification. Second, we can prove post-quantum security for FIDO2 under certain conditions and minor protocol extensions. Finally, we show that for some threat models, the downgrade resilience of FIDO2 can be improved, and we show how to achieve this with a simple modification.

    Quantum Lattice Enumeration in Limited Depth

    In 2018, Aono et al. (ASIACRYPT 2018) proposed to use quantum backtracking algorithms (Montanaro, TOC 2018; Ambainis and Kokainis, STOC 2017) to speed up lattice point enumeration. Quantum lattice sieving algorithms had already been proposed (Laarhoven et al., PQCRYPTO 2013) and shown to provide an asymptotic speedup over classical counterparts, but also to lose competitiveness at dimensions relevant to cryptography once practical considerations on quantum computer architecture are taken into account (Albrecht et al., ASIACRYPT 2020). Aono et al.'s work argued that quantum walk speedups can be applied to lattice enumeration, achieving at least a quadratic asymptotic speedup à la Grover search while not requiring exponential amounts of quantum-accessible classical memory, as is the case for sieving. In this work, we explore how to lower bound the cost of using Aono et al.'s techniques on lattice enumeration with extreme cylinder pruning, assuming a limit to the maximum depth that a quantum computation can achieve without decohering, with the objective of better understanding the practical applicability of quantum backtracking in lattice cryptanalysis.

    Drive (Quantum) Safe! – Towards PQ Authentication for V2V Communications

    We tackle a challenging problem at the intersection of two emerging technologies: post-quantum cryptography (PQC) and vehicle-to-vehicle (V2V) communication with its strict requirements. We are the first to devise and evaluate a practical, provably secure design for integrating PQ authentication into the IEEE 1609.2 V2V security ecosystem. By theoretically and empirically analyzing the three PQ signature algorithms selected for standardization by NIST, as well as XMSS (RFC 8391), we propose a Partially Hybrid design, a tailored fusion of classical cryptography and PQC, for use during the nascent transition period to PQC. As opposed to a direct substitution of PQC for classical cryptography, our design meets the unique constraints of standardized V2V protocols.

    Light the Signal: Optimization of Signal Leakage Attacks against LWE-Based Key Exchange

    Key exchange protocols from the learning with errors (LWE) problem share many similarities with the Diffie–Hellman–Merkle (DHM) protocol, which plays a central role in securing our Internet. Therefore, there has been a long-standing effort to design authenticated key exchange directly from LWE to mirror the advantages of DHM-based protocols. In this paper, we revisit signal leakage attacks and show that the severity of these attacks against LWE-based (authenticated) key exchange is still underestimated. In particular, by converting the problem of launching a signal leakage attack into a coding problem, we can significantly reduce the number of queries needed to reveal the secret key. Specifically, for DXL-KE we reduce the queries from 1,266 to only 29, while for DBS-KE we need only 748 queries, a great improvement over the previous 1,074,434 queries. Moreover, our new view of signals as binary codes enables recognizing vulnerable schemes more easily. As such, we completely recover the secret key of a password-based authenticated key exchange scheme by Dabra et al. with only 757 queries, and partially reveal the secret used in a two-factor authentication scheme by Wang et al. with only one query. The experimental evaluation supports our theoretical analysis and demonstrates the efficiency and effectiveness of our attacks. Our results caution against underestimating the power of signal leakage attacks, as they are applicable even in settings with a very restricted number of interactions between adversary and victim.

    Batch Signatures, Revisited

    We revisit batch signatures (previously considered in a draft RFC, and used in multiple recent works), where a single, potentially expensive, inner digital signature authenticates a Merkle tree constructed from many messages. We formalise a construction and prove its unforgeability and privacy properties. We also show that batch signing allows us to scale slow signing algorithms, such as those recently selected for standardisation as part of NIST's post-quantum project, to high throughput, with a mild increase in latency. For the example of Falcon-512 in TLS, we can increase the number of connections per second by a factor of 3.2x, at the cost of an increase in signature size of ~14% and in median latency of ~25%, where both are run on the same 30-core server. We also discuss applications where batch signatures allow us to increase throughput and to save bandwidth. For example, again for Falcon-512, once one batch signature is available, the additional bandwidth for each of the remaining N-1 signatures is only 82 bytes.
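The batching mechanism described above can be sketched as follows: hash the N messages into Merkle-tree leaves, sign only the root with the (expensive) inner scheme, and attach to each message its authentication path of about log2(N) sibling hashes. This is a minimal illustrative sketch under SHA-256, not the paper's formal construction; the inner signature itself is elided.

```python
# Sketch: a Merkle tree over N message hashes. One inner signature on the
# root authenticates all N messages; each message ships with the inner
# signature plus its own authentication path (~log2(N) hashes).
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root_and_paths(leaves):
    # Returns (root, paths) where paths[i] authenticates leaves[i].
    paths = [[] for _ in leaves]
    pos = list(range(len(leaves)))
    level = list(leaves)
    while len(level) > 1:
        if len(level) % 2:            # duplicate the last node on odd levels
            level.append(level[-1])
        for i, p in enumerate(pos):
            # record (is_right_child, sibling hash) for this level
            paths[i].append((p & 1, level[p ^ 1]))
            pos[i] = p // 2
        level = [h(level[j] + level[j + 1]) for j in range(0, len(level), 2)]
    return level[0], paths

def verify_path(leaf, path, root):
    node = leaf
    for is_right, sibling in path:
        node = h(sibling + node) if is_right else h(node + sibling)
    return node == root

msgs = [f"msg-{i}".encode() for i in range(5)]
root, paths = merkle_root_and_paths([h(m) for m in msgs])
# inner_sign(root) with, e.g., Falcon-512 would authenticate all messages;
# verification checks the path to the root and then the inner signature.
assert all(verify_path(h(m), p, root) for m, p in zip(msgs, paths))
```

The per-message overhead is one inner signature (shared, so effectively amortised) plus the path, which matches the paper's observation that the marginal bandwidth per additional message is small.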

    A German Digital Signature Scheme on the Way to Becoming an International Cryptographic Standard

    Whenever we gather people, there is an element of risk. There is no shortage of accidents involving large crowds, nor of resulting deaths and injuries. Internationally and in Norway, we see a considerable annual increase in the popularity of music festivals. What characterizes a music festival is that it is often temporary and offers an out-of-the-ordinary experience. This growing trend, the recurring accidents, and the temporary nature of festivals create a need to examine safety at music festivals. The analytical framework of the study builds on theories of safety management and on the authorities' influence on making music festivals safe. The research question the study seeks to answer is: How do regulatory requirements affect music festival organizers' safety management work? Through document analysis and participant observation, this study aims to contribute new knowledge about how laws and regulations affect the safety of a music festival. The study identifies the relevant legislation and examines the significance of its wording and of regulatory supervision, and how organizers are affected by this. The findings suggest that safety management is most influenced by the specific legislation, but that this covers too narrow a portion of the safety challenges. The remaining general legislation is formulated so broadly that organizers must interpret it themselves and implement safety measures accordingly. Since compliance with the general legislation is not inspected, safety management at music festivals is therefore highly dependent on individuals.